Universality of approximate message passing algorithms
Authors
Abstract
We consider a broad class of Approximate Message Passing (AMP) algorithms defined as a Lipschitzian functional iteration in terms of an n×n random symmetric matrix A. We establish universality in noise for this AMP in the n-limit and validate this behavior for a number of AMPs popularly adopted in compressed sensing, statistical inference, and optimization in spin glasses.
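To make the setup concrete, the following is a minimal sketch, not taken from the paper, of a generic AMP recursion driven by an n×n symmetric random matrix, with a Lipschitz nonlinearity applied entrywise and the usual Onsager correction; the function names, the Wigner-type matrix, and the choice of tanh are illustrative assumptions.

```python
import numpy as np

def amp_iteration(A, f, f_prime, z0, num_iters=20):
    """Generic AMP recursion z^{t+1} = A f(z^t) - b_t f(z^{t-1}),
    where b_t is the average of f'(z^t) (Onsager correction).
    Illustrative sketch only; f and f_prime act entrywise."""
    z_prev = np.zeros_like(z0)
    z = z0.copy()
    for _ in range(num_iters):
        b = f_prime(z).mean()              # Onsager correction coefficient
        z_next = A @ f(z) - b * f(z_prev)
        z_prev, z = z, z_next
    return z

# Illustrative usage: Wigner-type symmetric matrix and tanh nonlinearity.
rng = np.random.default_rng(0)
n = 1000
G = rng.normal(size=(n, n)) / np.sqrt(n)
A = (G + G.T) / np.sqrt(2)                 # symmetric, entries of order n^{-1/2}
z = amp_iteration(A, np.tanh, lambda x: 1.0 - np.tanh(x) ** 2,
                  z0=rng.normal(size=n))
```

Universality here refers, roughly, to the large-n behavior of such iterates being insensitive to the particular entry distribution of A (under suitable moment conditions), so the Gaussian entries above could for example be replaced by suitably scaled Rademacher entries.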
Similar resources
Universality in Polytope Phase Transitions and Message Passing Algorithms
We consider a class of nonlinear mappings F_{A,N} in R^N indexed by symmetric random matrices A ∈ R^{N×N} with independent entries. Within spin glass theory, special cases of these mappings correspond to iterating the TAP equations and were studied by Erwin Bolthausen. Within information theory, they are known as 'approximate message passing' algorithms. We study the high-dimensional (large N) behavior ...
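For reference, the mappings described in this abstract are commonly written in the AMP/TAP literature in the following generic form (standard notation, not quoted from the paper): the iterate $z^t \in \mathbb{R}^N$ is updated via

$$
z^{t+1} \;=\; A\, f_t(z^t) \;-\; b_t\, f_{t-1}(z^{t-1}),
\qquad
b_t \;=\; \frac{1}{N}\sum_{i=1}^{N} f_t'\big(z_i^{t}\big),
$$

where each $f_t$ acts entrywise and the memory term $b_t f_{t-1}(z^{t-1})$ is the Onsager correction that distinguishes AMP/TAP iterations from naive power-iteration-style updates.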
Approximate Message Passing
In this note, I summarize Sections 5.1 and 5.2 of Arian Maleki's PhD thesis. Notation: We denote scalars by small letters, e.g. a, b, c, ...; vectors by boldface small letters, e.g. λ, α, x, ...; matrices by boldface capital letters, e.g. A, B, C, ...; and (subsets of) natural numbers by capital letters, e.g. N, M, .... We denote the i-th element of a vector a by a_i and the (i, j)-th entry of a matrix A by ...
Approximate Message Passing Algorithms for Generalized Bilinear Inference
Recent developments in compressive sensing (CS) combined with increasing demands for effective high-dimensional inference techniques across a variety of disciplines have motivated extensive research into algorithms exploiting various notions of parsimony, including sparsity and low-rank constraints. In this dissertation, we extend the generalized approximate message passing (GAMP) approach, ori...
Parameterless Optimal Approximate Message Passing
Iterative thresholding algorithms are well-suited for high-dimensional problems in sparse recovery and compressive sensing. The performance of this class of algorithms depends heavily on the tuning of certain threshold parameters. In particular, both the final reconstruction error and the convergence rate of the algorithm crucially rely on how the threshold parameter is set at each step of the ...
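As a concrete illustration of the threshold-tuning issue described above, here is a minimal sketch of plain iterative soft-thresholding (an ISTA-style iteration, without the AMP Onsager term); the function names, the step size 1/L, and the example dimensions are assumptions made for illustration, not the tuning scheme proposed in the cited work.

```python
import numpy as np

def soft_threshold(x, tau):
    """Entrywise soft-thresholding: sign(x) * max(|x| - tau, 0)."""
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

def iterative_soft_thresholding(A, y, tau, num_iters=200):
    """Plain ISTA for y ≈ A x with sparse x: a gradient step on ||y - A x||^2 / 2
    followed by soft-thresholding, using step size 1/L with L = ||A||_2^2.
    Both the reconstruction error and the convergence speed depend heavily on
    the threshold tau, which is the tuning problem discussed above."""
    L = np.linalg.norm(A, 2) ** 2           # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(num_iters):
        x = soft_threshold(x + (A.T @ (y - A @ x)) / L, tau / L)
    return x

# Illustrative usage: sparse recovery from a random Gaussian design.
rng = np.random.default_rng(1)
m, n, k = 250, 500, 20
A = rng.normal(size=(m, n)) / np.sqrt(m)
x_true = np.zeros(n)
x_true[:k] = rng.normal(size=k)
x_hat = iterative_soft_thresholding(A, A @ x_true, tau=0.05)
```

The "parameterless" approach referred to above aims to set such thresholds automatically at each step rather than by hand-tuning.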
Bilinear Generalized Approximate Message Passing
We extend the generalized approximate message passing (G-AMP) approach, originally proposed for high-dimensional generalized-linear regression in the context of compressive sensing, to the generalized-bilinear case, which enables its application to matrix completion, robust PCA, dictionary learning, and related matrix-factorization problems. In the first part of the paper, we derive our Bilinear...
Journal
Journal title: Electronic Journal of Probability
Year: 2021
ISSN: 1083-6489
DOI: https://doi.org/10.1214/21-ejp604